Machine Learning Algorithm Comparison

This example illustrates fitting and comparing several machine learning algorithms for classifying the binary target in the HMEQ data set.

The data set used for this pipeline is from a financial services company that offers a home equity line of credit. The company has extended several thousand lines of credit in the past, and many of these accepted applicants have defaulted on their loans. Using demographic and financial variables, the company wants to build a model to classify whether an applicant will default.

The target variable "BAD" indicates whether an applicant defaulted on the home equity line of credit.

The steps include:

  1. PREPARE AND EXPLORE
    a) Check that the data is loaded into CAS

  2. PERFORM SUPERVISED LEARNING
    a) Fit a model using a Random Forest
    b) Fit a model using Gradient Boosting
    c) Fit a model using a Neural Network
    d) Fit a model using a Support Vector Machine

  3. EVALUATE AND IMPLEMENT
    a) Score the data
    b) Assess model performance
    c) Generate ROC and Lift charts
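
The comparison in step 3 ultimately reduces to scoring each model on the held-out partition and contrasting a common metric. A minimal local sketch (toy data; the column names p_rf and p_gb are illustrative, not produced by this pipeline):

```python
import pandas as pd

# Toy scored outputs: predicted probabilities of BAD=1 from two hypothetical
# models on the same validation rows (names p_rf/p_gb are illustrative).
scored = pd.DataFrame({
    "bad":  [1, 0, 0, 1, 0, 1],
    "p_rf": [0.9, 0.2, 0.4, 0.7, 0.1, 0.3],
    "p_gb": [0.8, 0.1, 0.6, 0.9, 0.2, 0.4],
})

def misclassification_rate(y, p, cutoff=0.5):
    """Fraction of rows where the cutoff-based prediction disagrees with the target."""
    return ((p >= cutoff).astype(int) != y).mean()

rates = {m: misclassification_rate(scored["bad"], scored["p_" + m])
         for m in ("rf", "gb")}
```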

Import packages


In [1]:
import pandas as pd
import swat
from matplotlib import pyplot as plt
from swat.render import render_html

%matplotlib inline

Input data details


In [2]:
indata = "hmeq"
indata_ext = ".sas7bdat"

Start CAS session


In [3]:
sess = swat.CAS("cas01", 19640)

Import action sets


In [4]:
sess.loadactionset(actionset="dataStep")
sess.loadactionset(actionset="dataPreprocess")
sess.loadactionset(actionset="cardinality")
sess.loadactionset(actionset="sampling")
sess.loadactionset(actionset="decisionTree")
sess.loadactionset(actionset="neuralNet")
sess.loadactionset(actionset="svm")
sess.loadactionset(actionset="astore")
sess.loadactionset(actionset="percentile")


NOTE: Added action set 'dataStep'.
NOTE: Added action set 'dataPreprocess'.
NOTE: Added action set 'cardinality'.
NOTE: Added action set 'sampling'.
NOTE: Added action set 'decisionTree'.
NOTE: Added action set 'neuralNet'.
NOTE: Added action set 'svm'.
NOTE: Added action set 'astore'.
NOTE: Added action set 'percentile'.
Out[4]:
§ actionset
percentile

elapsed 0.000214s · mem 0.048MB

Load data into CAS if needed


In [5]:
if not sess.table.tableExists(table=indata).exists:
    tbl = sess.upload_file(indata + indata_ext, casout={"name": indata})


NOTE: Cloud Analytic Services made the uploaded file available as table HMEQ in caslib CASUSER(kesmit).
NOTE: The table HMEQ has been created in caslib CASUSER(kesmit) from binary data uploaded to Cloud Analytic Services.

In [6]:
sess.tableinfo()


Out[6]:
§ TableInfo
Name Rows Columns Encoding CreateTimeFormatted ModTimeFormatted JavaCharSet CreateTime ModTime Global Repeated View SourceName SourceCaslib Compressed Creator Modifier
0 HMEQ 5960 13 utf-8 21Sep2016:15:46:51 21Sep2016:15:46:51 UTF8 1.790092e+09 1.790092e+09 0 0 0 0 kesmit

elapsed 0.00101s · user 0.000999s · mem 0.103MB

Explore and impute missing values

View first 5 observations from the data set


In [7]:
tbl.head()


Out[7]:
Selected Rows from Table HMEQ
BAD LOAN MORTDUE VALUE REASON JOB YOJ DEROG DELINQ CLAGE NINQ CLNO DEBTINC
0 1.0 1100.0 25860.0 39025.0 HomeImp Other 10.5 0.0 0.0 94.366667 1.0 9.0 NaN
1 1.0 1300.0 70053.0 68400.0 HomeImp Other 7.0 0.0 2.0 121.833333 0.0 14.0 NaN
2 1.0 1500.0 13500.0 16700.0 HomeImp Other 4.0 0.0 0.0 149.466667 1.0 10.0 NaN
3 1.0 1500.0 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
4 0.0 1700.0 97800.0 112000.0 HomeImp Office 3.0 0.0 0.0 93.333333 0.0 14.0 NaN

View table column information


In [8]:
tbl.columninfo()


Out[8]:
§ ColumnInfo
Column ID Type RawLength FormattedLength NFL NFD
0 BAD 1 double 8 12 0 0
1 LOAN 2 double 8 12 0 0
2 MORTDUE 3 double 8 12 0 0
3 VALUE 4 double 8 12 0 0
4 REASON 5 char 7 7 0 0
5 JOB 6 char 7 7 0 0
6 YOJ 7 double 8 12 0 0
7 DEROG 8 double 8 12 0 0
8 DELINQ 9 double 8 12 0 0
9 CLAGE 10 double 8 12 0 0
10 NINQ 11 double 8 12 0 0
11 CLNO 12 double 8 12 0 0
12 DEBTINC 13 double 8 12 0 0

elapsed 0.000746s · user 0.001s · mem 0.167MB


In [9]:
tbl.shape


Out[9]:
(5960, 13)

In [10]:
tbl.describe()


Out[10]:
BAD LOAN MORTDUE VALUE YOJ DEROG DELINQ CLAGE NINQ CLNO DEBTINC
count 5960.000000 5960.000000 5442.000000 5848.000000 5445.000000 5252.000000 5380.000000 5652.000000 5450.000000 5738.000000 4693.000000
mean 0.199497 18607.969799 73760.817200 101776.048741 8.922268 0.254570 0.449442 179.766275 1.186055 21.296096 33.779915
std 0.399656 11207.480417 44457.609458 57385.775334 7.573982 0.846047 1.127266 85.810092 1.728675 10.138933 8.601746
min 0.000000 1100.000000 2063.000000 8000.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.524499
25% 0.000000 11100.000000 46268.000000 66069.000000 3.000000 0.000000 0.000000 115.103197 0.000000 15.000000 29.140031
50% 0.000000 16300.000000 65019.000000 89235.500000 7.000000 0.000000 0.000000 173.466667 1.000000 20.000000 34.818262
75% 0.000000 23300.000000 91491.000000 119831.500000 13.000000 0.000000 0.000000 231.574834 2.000000 26.000000 39.003141
max 1.000000 89900.000000 399550.000000 855909.000000 41.000000 10.000000 15.000000 1168.233561 17.000000 71.000000 203.312149

Explore data and plot missing values


In [11]:
sess.cardinality.summarize(
  table={"name":indata}, 
  cardinality={"name":"data_card", "replace":True}
)

tbl_data_card = sess.CASTable('data_card').query('_NMISS_ > 0')

print("Data Summary".center(80, '-')) # print title

df_data_card = tbl_data_card.to_frame(fetchvars=['_VARNAME_', '_NMISS_', '_NOBS_'])
df_data_card['PERCENT_MISSING'] = (df_data_card['_NMISS_'] / df_data_card['_NOBS_']) * 100

tbl_forplot = pd.Series(list(df_data_card['PERCENT_MISSING']), index=list(df_data_card['_VARNAME_']))
ax = tbl_forplot.plot(
  kind='bar', 
  title='Percentage of Missing Values',
  figsize=(11,5)
)
ax.set_ylabel('Percent Missing')
ax.set_xlabel('Variable Names');


NOTE: Writing cardinality.
NOTE: status = 0.
NOTE: The Cloud Analytic Services server processed the request in 0.000713732 seconds.
----------------------------------Data Summary----------------------------------
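
The same percent-missing quantity can be reproduced locally with pandas; a small self-contained sketch (toy values, not the actual HMEQ columns):

```python
import pandas as pd
import numpy as np

# Toy stand-in for a few HMEQ columns (values are illustrative)
df = pd.DataFrame({
    "DEBTINC": [33.0, np.nan, np.nan, 40.0],
    "CLAGE":   [94.4, 121.8, np.nan, 93.3],
    "LOAN":    [1100, 1300, 1500, 1700],
})

# Same quantity the cardinality action reports: _NMISS_ / _NOBS_ * 100
pct_missing = df.isna().sum() / len(df) * 100
pct_missing = pct_missing[pct_missing > 0]  # keep only columns with missing values
```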

Impute missing values


In [12]:
sess.dataPreprocess.transform(
  table={"name":indata},
  casOut={"name":"hmeq_prepped_pr", "replace":True},
  copyAllVars=True,
  outVarsNameGlobalPrefix="IM",
  requestPackages=[
    {"impute":{"method":"MEAN"}, "inputs":{"clage"}},
    {"impute":{"method":"MEDIAN"}, "inputs":{"delinq", "debtinc", "yoj", "ninq"}},
    {"impute":{"method":"MODE"}, "inputs":{"job", "reason"}}
  ]
)


Out[12]:
§ TransInfo
Transformation Requests for HMEQ
ActualName NTransVars ImputeMethod
0 _TR1 1 Mean
1 _TR2 4 Median
2 _TR3 2 Mode

§ VarTransInfo
Variable Transformation Information for HMEQ
Variable Transformation ResultVar N NMiss ImputedValueContinuous ImputedValueNominal
0 CLAGE IM IM_CLAGE 5652 308 179.766275
1 DEBTINC IM IM_DEBTINC 4693 1267 34.818262
2 DELINQ IM IM_DELINQ 5380 580 0.000000
3 NINQ IM IM_NINQ 5450 510 1.000000
4 YOJ IM IM_YOJ 5445 515 7.000000
5 JOB IM IM_JOB 5681 279 NaN Other
6 REASON IM IM_REASON 5708 252 NaN DebtCon

§ NomVarInfo
Nominal Variable Information for HMEQ
Variable N NMiss NLevels
0 JOB 5681.0 279.0 6.0
1 REASON 5708.0 252.0 2.0

§ OutputCasTables
casLib Name Rows Columns casTable
0 CASUSER(kesmit) hmeq_prepped_pr 5960 20 CASTable('hmeq_prepped_pr', caslib='CASUSER(ke...

elapsed 0.0882s · user 0.067s · sys 0.031s · mem 34.9MB
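
The three imputation methods above (MEAN, MEDIAN, MODE) map directly onto pandas fillna; a hedged local sketch with toy values:

```python
import pandas as pd
import numpy as np

# Toy columns with missing values (illustrative)
df = pd.DataFrame({
    "CLAGE":  [100.0, 200.0, np.nan],
    "DELINQ": [0.0, np.nan, 2.0],
    "JOB":    ["Other", None, "Other"],
})

# Mirror the three impute methods used above, writing IM_-prefixed copies
# as outVarsNameGlobalPrefix="IM" does
df["IM_CLAGE"]  = df["CLAGE"].fillna(df["CLAGE"].mean())      # MEAN
df["IM_DELINQ"] = df["DELINQ"].fillna(df["DELINQ"].median())  # MEDIAN
df["IM_JOB"]    = df["JOB"].fillna(df["JOB"].mode().iloc[0])  # MODE
```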

Create new indicator variable for missing DEBTINC values


In [13]:
sess.dataStep.runcode(code="""
    data hmeq_prepped;
        set hmeq_prepped_pr;
        if missing(DEBTINC) then DEBTINC_IND = 1;
        else DEBTINC_IND = 0;
    run;
""")


Out[13]:
§ InputCasTables
casLib Name Rows Columns casTable
0 CASUSER(kesmit) hmeq_prepped_pr 5960 20 CASTable('hmeq_prepped_pr', caslib='CASUSER(ke...

§ OutputCasTables
casLib Name Rows Columns casTable
0 CASUSER(kesmit) hmeq_prepped 5960 21 CASTable('hmeq_prepped', caslib='CASUSER(kesmi...

elapsed 0.121s · user 0.135s · sys 0.087s · mem 62.8MB


In [14]:
tbl_tmp = sess.CASTable("hmeq_prepped")
tbl_tmp.head()


Out[14]:
Selected Rows from Table HMEQ_PREPPED
BAD LOAN MORTDUE VALUE REASON JOB YOJ DEROG DELINQ CLAGE ... CLNO DEBTINC IM_CLAGE IM_DEBTINC IM_DELINQ IM_NINQ IM_YOJ IM_JOB IM_REASON DEBTINC_IND
0 1.0 1100.0 25860.0 39025.0 HomeImp Other 10.5 0.0 0.0 94.366667 ... 9.0 NaN 94.366667 34.818262 0.0 1.0 10.5 Other HomeImp 1.0
1 1.0 1300.0 70053.0 68400.0 HomeImp Other 7.0 0.0 2.0 121.833333 ... 14.0 NaN 121.833333 34.818262 2.0 0.0 7.0 Other HomeImp 1.0
2 1.0 1500.0 13500.0 16700.0 HomeImp Other 4.0 0.0 0.0 149.466667 ... 10.0 NaN 149.466667 34.818262 0.0 1.0 4.0 Other HomeImp 1.0
3 1.0 1500.0 NaN NaN NaN NaN NaN NaN NaN NaN ... NaN NaN 179.766275 34.818262 0.0 1.0 7.0 Other DebtCon 1.0
4 0.0 1700.0 97800.0 112000.0 HomeImp Office 3.0 0.0 0.0 93.333333 ... 14.0 NaN 93.333333 34.818262 0.0 0.0 3.0 Office HomeImp 1.0

5 rows × 21 columns
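
The DATA step's missing-value flag has a one-line pandas equivalent; a small sketch with illustrative values:

```python
import pandas as pd
import numpy as np

# Toy DEBTINC column with two missing values (illustrative)
df = pd.DataFrame({"DEBTINC": [33.8, np.nan, 29.1, np.nan]})

# Equivalent of the DATA step: flag rows where DEBTINC was originally missing
df["DEBTINC_IND"] = df["DEBTINC"].isna().astype(int)
```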

Set variables for input data


In [15]:
target          = "bad"
class_inputs    = ["im_reason", "im_job", "debtinc_ind"]
class_vars      = [target] + class_inputs
interval_inputs = ["im_clage", "clno", "im_debtinc", "loan", "mortdue", "value", "im_yoj", "im_ninq", "derog", "im_delinq"]
all_inputs      = interval_inputs + class_inputs

Partition data into Training and Validation


In [16]:
sess.sampling.stratified(
  table={"name":"hmeq_prepped", "groupBy":"bad"}, 
  output={"casOut":{"name":"hmeq_part", "replace":True}, "copyVars":"ALL"},
  samppct=70,
  partind=True
)


NOTE: Using SEED=456623202 for sampling.
Out[16]:
§ outputSize
{'outputNObs': 5960.0, 'outputNVars': 22}

§ STRAFreq
Frequencies
ByGrpID BAD NObs NSamp
0 0 0 4771 3340
1 1 1 1189 832

§ OutputCasTables
casLib Name Label Rows Columns casTable
0 CASUSER(kesmit) hmeq_part 5960 22 CASTable('hmeq_part', caslib='CASUSER(kesmit)')

elapsed 0.0854s · user 0.074s · sys 0.029s · mem 36.9MB
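
Outside CAS, the same per-stratum 70% sample can be sketched with pandas groupby(...).sample (toy data; the sizes and random_state are illustrative stand-ins for the action's SEED):

```python
import pandas as pd

# Toy frame: 80 non-events and 20 events (sizes are illustrative)
df = pd.DataFrame({"bad": [0] * 80 + [1] * 20})

# Per-stratum 70% sample, mirroring sampling.stratified with
# groupBy="bad" and samppct=70
train_idx = df.groupby("bad").sample(frac=0.7, random_state=0).index
df["_partind_"] = df.index.isin(train_idx).astype(int)
```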

Random Forest


In [17]:
sess.help(actionset="decisionTree");


NOTE: Information for action set 'decisionTree':
NOTE:    decisionTree
NOTE:       dtreeTrain - Train a decision tree
NOTE:       dtreeScore - Score a table using a decision tree model
NOTE:       dtreeSplit - Split decision tree nodes
NOTE:       dtreePrune - Prune a decision tree
NOTE:       dtreeMerge - Merge decision tree nodes
NOTE:       dtreeCode - Generate DATA step scoring code from a decision tree model
NOTE:       forestTrain - Train a forest
NOTE:       forestScore - Score a table using a forest model
NOTE:       forestCode - Generate DATA step scoring code from a forest model
NOTE:       gbtreeTrain - Train a gradient boosting tree
NOTE:       gbtreeScore - Score a table using a gradient boosting tree model
NOTE:       gbtreecode - Generate DATA step scoring code from a gradient boosting tree model

In [18]:
rf = sess.decisionTree.forestTrain(
  table={
    "name":"hmeq_part",
    "where":"strip(put(_partind_, best.))='1'"
  },
  inputs=all_inputs,
  nominals=class_vars,
  target="bad",
  nTree=20,
  nBins=20,
  leafSize=5,
  maxLevel=21,
  crit="GAINRATIO",
  varImp=True,
  seed=100,
  OOB=True,
  vote="PROB",
  casOut={"name":"forest_model", "replace":True}
)

# Output model statistics
render_html(rf)

# Score 
sess.decisionTree.forestScore(
  table={"name":"hmeq_part"},
  modelTable={"name":"forest_model"},
  casOut={"name":"_scored_rf", "replace":True},
  copyVars={"bad", "_partind_"},
  vote="PROB"
)

# Create p_bad0 and p_bad1: _rf_predp_ is the probability of the predicted level in _rf_predname_
sess.dataStep.runCode(
  code="""data _scored_rf; set _scored_rf; if _rf_predname_=1 then do; p_bad1=_rf_predp_; 
    p_bad0=1-p_bad1; end; if _rf_predname_=0 then do; p_bad0=_rf_predp_; p_bad1=1-p_bad0; end; run;"""
)


Forest for HMEQ_PART
Descr                              Value
Number of Trees                    20
Number of Selected Variables (M)   4
Random Number Seed                 100
Bootstrap Percentage (%)           63.212055882
Number of Bins                     20
Number of Variables                13
Confidence Level for Pruning       0.25
Max Number of Tree Nodes           291
Min Number of Tree Nodes           191
Max Number of Branches             2
Min Number of Branches             2
Max Number of Levels               21
Min Number of Levels               21
Max Number of Leaves               146
Min Number of Leaves               96
Maximum Size of Leaves             1137
Minimum Size of Leaves             5
Out-of-Bag MCR (%)                 10.498561841

OOB Error With Forest Analytics for HMEQ_PART
TreeID Trees NLeaves MCR          LogLoss      ASE          RASE         MAXAE
0      1     120     0.1370106762 1.1036530842 0.1039476015 0.3224090592 1
1      2     243     0.1317711503 1.2020064975 0.1796934604 0.4239026544 1
2      3     358     0.1250665247 1.0648005624 0.2165170993 0.4653139793 1
3      4     470     0.1202515723 0.9960894197 0.2422090818 0.4921474188 1
4      5     614     0.1172295644 0.996200428  0.263578947  0.513399403  1
5      6     747     0.1160822249 0.9720501536 0.2734196302 0.5228954295 1
6      7     878     0.110709988  0.9587931765 0.2817242559 0.5307770303 1
7      8     1005    0.1099903939 0.9484385127 0.286969798  0.5356956207 1
8      9     1143    0.1089512839 0.9302111644 0.2901133379 0.538621702  1
9      10    1239    0.1095923261 0.9096048541 0.2923998802 0.5407401226 1
10     11    1367    0.1086330935 0.9046312834 0.2947209344 0.542882063  1
11     12    1497    0.1045312875 0.8949852046 0.2973966767 0.5453408812 1
12     13    1643    0.1038120355 0.8862804305 0.2990208019 0.5468279454 1
13     14    1779    0.1047710381 0.8908671365 0.3013909477 0.5489908449 1
14     15    1899    0.104506232  0.8847120504 0.3028501936 0.5503182658 1
15     16    2036    0.1042665388 0.884545379  0.3041146343 0.5514658959 1
16     17    2164    0.1047459252 0.8842220138 0.3046925035 0.5519895864 1
17     18    2286    0.1059443912 0.8851196913 0.3050670292 0.5523287329 1
18     19    2414    0.1042665388 0.8843760018 0.3051261311 0.5523822328 1
19     20    2546    0.1049856184 0.8867663662 0.3063140669 0.5534564725 1

Forest for HMEQ_PART
Variable      Importance    Std
DEBTINC_IND   243.91395118  112.02126828
IM_DELINQ     52.362927937  7.7363978972
IM_DEBTINC    45.837814362  10.117386013
DEROG         32.63791989   6.5329884971
IM_CLAGE      25.643408703  2.7750146773
IM_JOB        24.528307679  2.2762677724
IM_NINQ       23.423821178  2.4255115085
IM_YOJ        22.758647168  1.6983972402
CLNO          22.263719283  2.0284651304
MORTDUE       21.241169221  1.7538408033
LOAN          20.651855779  3.6265888785
VALUE         18.345612356  2.1351707007
IM_REASON     5.9778081547  1.5981960912

Output CAS Tables
CAS Library      Name          Number of Rows  Number of Columns  Table
CASUSER(kesmit)  forest_model  5072            39                 CASTable('forest_model', caslib='CASUSER(kesmit)')
NOTE: Character values have been converted to numeric values at the places given by: (Line):(Column).
      0:35    0:109
NOTE: Duplicate messages output by DATA step:
NOTE: Character values have been converted to numeric values at the places given by: (Line):(Column).  (occurred 32 times)
      0:35    0:109  (occurred 32 times)
Out[18]:
§ InputCasTables
casLib Name Rows Columns casTable
0 CASUSER(kesmit) _scored_rf 5960 7 CASTable('_scored_rf', caslib='CASUSER(kesmit)')

§ OutputCasTables
casLib Name Rows Columns casTable
0 CASUSER(kesmit) _scored_rf 5960 9 CASTable('_scored_rf', caslib='CASUSER(kesmit)')

elapsed 0.126s · user 0.145s · sys 0.115s · mem 67.8MB
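
The probability-recovery DATA step above has a vectorized pandas analogue; a hedged sketch on toy scored rows:

```python
import pandas as pd

# Toy scored rows: _rf_predp_ is the probability of the *predicted* level
# held in _rf_predname_ (values are illustrative)
scored = pd.DataFrame({
    "_rf_predname_": [1.0, 0.0, 1.0],
    "_rf_predp_":    [0.8, 0.7, 0.55],
})

# Recover both class probabilities, matching the DATA step logic
is_event = scored["_rf_predname_"] == 1
scored["p_bad1"] = scored["_rf_predp_"].where(is_event, 1 - scored["_rf_predp_"])
scored["p_bad0"] = 1 - scored["p_bad1"]
```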


In [19]:
list(rf.keys())


Out[19]:
['ModelInfo', 'ErrorMetricInfo', 'DTreeVarImpInfo', 'OutputCasTables']

In [20]:
rf['DTreeVarImpInfo']


Out[20]:
Forest for HMEQ_PART
Variable Importance Std
0 DEBTINC_IND 243.913951 112.021268
1 IM_DELINQ 52.362928 7.736398
2 IM_DEBTINC 45.837814 10.117386
3 DEROG 32.637920 6.532988
4 IM_CLAGE 25.643409 2.775015
5 IM_JOB 24.528308 2.276268
6 IM_NINQ 23.423821 2.425512
7 IM_YOJ 22.758647 1.698397
8 CLNO 22.263719 2.028465
9 MORTDUE 21.241169 1.753841
10 LOAN 20.651856 3.626589
11 VALUE 18.345612 2.135171
12 IM_REASON 5.977808 1.598196
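
To compare variables across models it can help to normalize importance to a share of the total; a small sketch on illustrative values shaped like rf['DTreeVarImpInfo']:

```python
import pandas as pd

# Illustrative importance values, shaped like rf['DTreeVarImpInfo']
imp = pd.DataFrame({
    "Variable":   ["DEBTINC_IND", "IM_DELINQ", "IM_DEBTINC"],
    "Importance": [240.0, 50.0, 10.0],
})

# Express each variable's importance as a percentage share of the total,
# which makes importances comparable across differently scaled models
imp["RelImportance"] = imp["Importance"] / imp["Importance"].sum() * 100
```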

Gradient Boosting


In [21]:
gb = sess.decisionTree.gbtreeTrain(
  table={
    "name":"hmeq_part",
    "where":"strip(put(_partind_, best.))='1'"
  },
  inputs=all_inputs,
  nominals=class_vars,
  target="bad",
  nTree=10,
  nBins=20,
  maxLevel=6,
  varImp=True,
  casOut={"name":"gb_model", "replace":True}
)

# Output model statistics
render_html(gb)

# Score 
sess.decisionTree.gbtreeScore(
  table={"name":"hmeq_part"},
  modelTable={"name":"gb_model"},
  casOut={"name":"_scored_gb", "replace":True},
  copyVars={"bad", "_partind_"}
)

# Create p_bad0 and p_bad1: _gbt_predp_ is the probability of the predicted level in _gbt_predname_
sess.dataStep.runCode(
  code="""data _scored_gb; set _scored_gb; if _gbt_predname_=1 then do; p_bad1=_gbt_predp_; 
    p_bad0=1-p_bad1; end; if _gbt_predname_=0 then do; p_bad0=_gbt_predp_; p_bad1=1-p_bad0; end; run;"""
)


Gradient Boosting Tree for HMEQ_PART
Descr                              Value
Number of Trees                    10
Distribution                       2
Learning Rate                      0.1
Subsampling Rate                   0.5
Number of Selected Variables (M)   13
Number of Bins                     20
Number of Variables                13
Max Number of Tree Nodes           57
Min Number of Tree Nodes           47
Max Number of Branches             2
Min Number of Branches             2
Max Number of Levels               6
Min Number of Levels               6
Max Number of Leaves               29
Min Number of Leaves               24
Maximum Size of Leaves             1440
Minimum Size of Leaves             5
Random Number Seed                 0

Decision Tree for HMEQ_PART
Variable      Importance    Std
DEBTINC_IND   130.84701934  96.207964657
IM_DEBTINC    29.877528719  19.711368144
IM_DELINQ     20.360116     5.3338895179
IM_CLAGE      10.35666295   3.9136317203
DEROG         8.1858447535  5.5348823741
CLNO          7.326855523   2.0332162503
VALUE         6.7161612317  2.2283466493
IM_JOB        6.5987607753  2.104645775
IM_YOJ        6.0000529727  2.1595597084
MORTDUE       4.6142162762  2.1234128609
IM_NINQ       4.6108044089  2.5487098494
LOAN          2.792862403   1.3021276913
IM_REASON     1.1489827491  1.9710703771

Output CAS Tables
CAS Library      Name      Number of Rows  Number of Columns  Table
CASUSER(kesmit)  gb_model  508             29                 CASTable('gb_model', caslib='CASUSER(kesmit)')
NOTE: Character values have been converted to numeric values at the places given by: (Line):(Column).
      0:35    0:111
NOTE: Duplicate messages output by DATA step:
NOTE: Character values have been converted to numeric values at the places given by: (Line):(Column).  (occurred 32 times)
      0:35    0:111  (occurred 32 times)
Out[21]:
§ InputCasTables
casLib Name Rows Columns casTable
0 CASUSER(kesmit) _scored_gb 5960 6 CASTable('_scored_gb', caslib='CASUSER(kesmit)')

§ OutputCasTables
casLib Name Rows Columns casTable
0 CASUSER(kesmit) _scored_gb 5960 8 CASTable('_scored_gb', caslib='CASUSER(kesmit)')

elapsed 0.126s · user 0.149s · sys 0.103s · mem 65.9MB

Neural Network


In [22]:
nn = sess.neuralNet.annTrain(
  table={
    "name":"hmeq_part",
    "where":"strip(put(_partind_, best.))='1'"
  },
  validTable={
    "name":"hmeq_part",
    "where":"strip(put(_partind_, best.))='0'"
  },
  inputs=all_inputs,
  nominals=class_vars,
  target="bad",
  hiddens={2},
  acts={"TANH"},
  combs={"LINEAR"},
  targetAct="SOFTMAX",
  errorFunc="ENTROPY",
  std="MIDRANGE",
  randDist="UNIFORM",
  scaleInit=1,
  nloOpts={
    "optmlOpt":{"maxIters":250, "fConv":1e-10}, 
    "lbfgsOpt":{"numCorrections":6},
    "printOpt":{"printLevel":"printDetail"},
    "validate":{"frequency":1}
  },
  casOut={"name":"nnet_model", "replace":True}
)

# Output model statistics
render_html(nn)

# Score 
sess.neuralNet.annScore(
  table={"name":"hmeq_part"},
  modelTable={"name":"nnet_model"},
  casOut={"name":"_scored_nn", "replace":True},
  copyVars={"bad", "_partind_"}
)

# Create p_bad0 and p_bad1: _nn_predp_ is the probability of the predicted level in _nn_predname_
sess.dataStep.runCode(
  code="""data _scored_nn; set _scored_nn; if _nn_predname_=1 then do; p_bad1=_nn_predp_; 
    p_bad0=1-p_bad1; end; if _nn_predname_=0 then do; p_bad0=_nn_predp_; p_bad1=1-p_bad0; end; run;"""
)


Iteration History
Progress Objective Loss Validation Error Step Size L1 Norm L2 Norm MAX Norm Gradient Norm
[167 L-BFGS iterations, condensed: the objective falls from 2.8307 (validation error 0.7939) at iteration 1 to 1.0714 (validation error 0.1173) at iteration 167.]
1681.07117994471.07117994470.11658570441126.4163210535.61567614828.8377378650.0301169762
1691.07091048351.07091048350.11727966691126.6596644135.68544102828.895853280.0147609105
1701.07068407341.07068407340.11658570441127.8728717336.05230174529.204021750.0090539247
1711.07059205251.07059205250.11658570441128.5836813836.26874848829.3856604070.0103982387
1721.07011497031.07011497030.11658570441130.6257302836.8742722729.8900872360.0129560372
1731.06976994251.06976994250.11519777931136.019371338.48612629131.227796810.0247925396
1741.06939708511.06939708510.11519777931137.5118497938.89898450131.5641780960.0136800603
1751.06911217931.06911217930.11519777931136.3705233938.56441896331.2861893850.0075899747
1761.06885063731.06885063730.11589174181136.6126459638.64271769731.3480765390.0076932825
1771.06855891661.06855891660.11797362941138.0726905439.07540914931.6938969330.0276366787
1781.06807546151.06807546150.11589174181141.1069635639.98774166832.4407992670.0132416552
1791.06773813261.06773813260.11797362941143.9205615940.82918783933.1285036970.0150043175
1801.06726289081.06726289080.11658570441149.0638037642.36974296434.3835139450.0129940514
1811.0667624371.0667624370.11797362941153.7451681943.77294294935.5206203040.0091587979
1821.06624278871.06624278870.11727966691161.7521594246.03500300637.3366559950.0130977637
1831.06577523081.06577523080.11589174181166.7626750347.45483951438.4845027120.0182924941
1841.06538770521.06538770520.11727966691165.110310146.9737124738.0945579330.0089728407
1851.06500192821.06500192820.1186675921163.4367275746.47478445837.6878132580.008490139
1861.06486393721.06486393720.11589174180.2486925749165.6433807747.09276111838.182291210.0189485338
1871.06460331311.06460331310.11658570441165.5398332547.03929585738.1342584250.0137958259
1881.06412872931.06412872930.11519777931168.4549960447.82935169538.7621533950.0087985596
1891.06347761441.06347761440.11450381681175.9141741149.89559565240.4097643920.0073596606
1901.0630925361.0630925360.11450381681181.0307061351.32445433341.5483571570.0083111847
1911.06245626871.06245626870.11589174181185.2413844252.46680559442.4507526930.0076882482
1921.06217271731.06217271730.11658570440.3358743411189.6190870853.69548249643.4271903460.0101319946
1931.06189563491.06189563490.11589174181189.6847513853.7034657543.4324329310.0054005792
1941.06166572051.06166572050.11519777931189.6815769353.69606983943.4275159250.0052021428
1951.06161255161.06161255160.11658570440.1196729158190.6705557453.9742657343.6486520420.0122856079
1961.06145813871.06145813870.11797362941191.5949595754.23420578343.8574436060.0103412688
1971.06107479231.06107479230.11658570441199.0703212656.31032668645.5058805480.008967209
1981.06081735431.06081735430.11311589171204.0135638357.70395864346.6150522130.0060587964
1991.06067002461.06067002460.11519777931206.2329302558.33177415247.1130802980.0050586675
2001.06038561511.06038561510.11519777931211.6962526659.88958070548.352556510.0048557745
2011.06026986851.06026986850.11450381681217.3324250961.50293812349.6353874190.0145062162
2021.06001236721.06001236720.11519777931217.870928461.65887552449.7610788950.0098324397
2031.05984337681.05984337680.11519777931217.4107055961.53806977649.6700593840.0057086268
2041.05970523131.05970523130.11658570441218.6696974961.90944551649.9688350630.0064414462
2051.05945510651.05945510650.11658570441222.4439109663.00406625450.8439095570.006557792
2061.05870439541.05870439540.11658570441232.0727869665.75108496453.0382221620.0106921403
2071.0585079181.0585079180.11242192921245.5974527869.61163246156.1147669340.0168753549
2081.05807818531.05807818530.11450381681243.7630515169.09196806655.700270980.0054800495
2091.05793566631.05793566630.11450381681244.2884580869.24767913255.8237323890.0047790483
2101.05759594921.05759594920.11519777931248.5867059670.49799066556.8187531310.0083685379
2111.05744461271.05744461270.11380985430.431908278252.5721207571.66017161757.7430438540.0145275732
2121.05718502361.05718502360.11589174181258.8061626773.46617191159.1796759680.0081604309
2131.05699435841.05699435840.11519777931265.1427164575.30258261860.6398439040.0053962176
2141.05681227621.05681227620.11589174181271.2039512777.06394613962.0397706750.0061580178
2151.05648769031.05648769030.11519777931282.2468895980.28339731964.5969572230.0073198846
2161.05596470161.05596470160.11519777931302.002909186.07789033169.1949158440.0087122412
2171.05579008581.05579008580.11658570441340.9812988697.54583392978.2912493580.0169191745
2181.05516985541.05516985540.11450381681340.3986384697.39177928178.1678512060.0062139819
2191.05495635221.05495635220.11589174181340.8106532397.53338518978.2816900060.0093934842
2201.05469430081.05469430080.11658570441344.6665362298.70009755679.2090335810.0074355019
2211.0543667971.0543667970.11519777931352.22446322100.9611491481.0060210450.0070201011
2221.05346691.05346690.11450381681384.56769548110.5466097488.6093046420.0067499748
2231.05289309461.05289309460.11797362941429.36667231123.8246557399.1396081070.0182890557
2241.05223272251.05223272250.11519777931443.53141504127.95909021102.414462070.0057400982
2251.05207667021.05207667020.11450381681447.33755549129.05780285103.284998790.0049097214
2261.05186740211.05186740210.11450381681452.54774307130.55963825104.475973390.0054866439
2271.05172438821.05172438820.11727966690.3605291862457.00028988131.84871363105.499317920.0134448442
2281.05142003841.05142003840.11727966691460.76212753132.93082447106.358578470.009160309
2291.05098669721.05098669720.11658570441466.07393819134.47112854107.582556590.0044225595
2301.05078494381.05078494380.11658570441468.22160695135.11023188108.090467590.0040809245
2311.05054468041.05054468040.11658570441470.85515569135.91560326108.73092790.0043671188
2321.05041972481.05041972480.11658570440.5101931319473.42048941136.69680296109.352190810.0063443046
2331.05024878411.05024878410.11658570441476.08788284137.50479041109.994182140.0047292739
2341.04985326211.04985326210.11589174181483.96787221139.89274194111.889873730.0057316327
2351.04966057451.04966057450.11380985431499.07294622144.4109734115.474858810.0242000931
2361.04916883081.04916883080.11380985431504.60476995146.08308645116.800461850.0104163181
2371.04884851841.04884851840.11450381681510.67135336147.89329893118.234964730.006315519
2381.04864195961.04864195960.11450381681517.66007319149.97506016119.885045170.0062612822
2391.04824601971.04824601970.11589174181527.95723221153.03771383122.312760290.0064551417
2401.04787953791.04787953790.11589174180.42508166542.5981749157.40990262125.781009750.0114716348
2411.04745502711.04745502710.11380985431552.13218944160.25579879128.037846380.0045537498
2421.04726966791.04726966790.11450381681553.32028857160.60908513128.319739830.0035463213
2431.04706876321.04706876320.11519777931556.27196539161.48514628129.017454030.0042124361
2441.04688373631.04688373630.11519777931571.29987806165.91684644132.538863820.0189431328
2451.04645585171.04645585170.11519777931577.25029831167.6906983133.947771410.0087113444
2461.04625943281.04625943280.11519777931577.88146169167.87230357134.09260680.0048831722
2471.04606197341.04606197340.11519777931584.28133517169.75749474135.589858890.0036941483
2481.04582822461.04582822460.11519777931592.62524125172.22355937137.549428930.0046956524
2491.04530295081.04530295080.11589174181614.71434532178.75342997142.738992210.007579653
2501.04511471931.04511471930.11589174180.327835453625.75225902182.02376208145.337817480.0086152641
Convergence Status
The optimization exited on maximum iterations.
Neural Net Model Info for HMEQ_PART
Descr Value
Model  Neural Net
Number of Observations Used  3381
Number of Observations Read  4172
Target/Response Variable  BAD
Number of Nodes  24
Number of Input Nodes  20
Number of Output Nodes  2
Number of Hidden Nodes  2
Number of Hidden Layers  1
Number of Weight Parameters  42
Number of Bias Parameters  4
Architecture  MLP
Number of Neural Nets  1
Objective Value  1.0451147193
Misclassification Error for Validation (%)  11.589174185
Output CAS Tables
CAS Library Name Number of Rows Number of Columns Table
CASUSER(kesmit)  nnet_model  44  15  CASTable('nnet_model', caslib='CASUSER(kesmit)')
NOTE: Character values have been converted to numeric values at the places given by: (Line):(Column).
      0:35    0:109
NOTE: Duplicate messages output by DATA step:
NOTE: Character values have been converted to numeric values at the places given by: (Line):(Column).  (occurred 32 times)
      0:35    0:109  (occurred 32 times)
Out[22]:
§ InputCasTables
casLib Name Rows Columns casTable
0 CASUSER(kesmit) _scored_nn 5960 4 CASTable('_scored_nn', caslib='CASUSER(kesmit)')

§ OutputCasTables
casLib Name Rows Columns casTable
0 CASUSER(kesmit) _scored_nn 5960 6 CASTable('_scored_nn', caslib='CASUSER(kesmit)')

elapsed 0.126s · user 0.146s · sys 0.1s · mem 66.1MB

Support Vector Machine


In [23]:
sv = sess.svm.svmTrain(
  table={
    "name":"hmeq_part", 
    "where":"_partind_=1"
  },
  inputs=all_inputs,
  nominals=class_vars,
  target="bad",
  kernel="POLYNOMIAL",
  degree=2,
  id=["bad", "_partind_"],  # a list keeps the ID columns in a fixed order
  savestate={"name":"svm_astore_model", "replace":True}
)

# Output model statistics
render_html(sv)

# Score using ASTORE
render_html(sess.astore.score(
  table={"name":"hmeq_part"},
  rstore={"name":"svm_astore_model"},
  out={"name":"_scored_svm", "replace":True}
))


NOTE: SVM training is activated.
NOTE: Wrote 10929 bytes to the savestate file svm_astore_model.
Model Information
Descr Value
Task Type  C_CLAS
Optimization Technique  Interior Point
Scale  YES
Kernel Function  Polynomial
Kernel Degree  2
Penalty Method  C
Penalty Parameter  1
Maximum Iterations  25
Tolerance  1e-06
Observations
Descr N
Number of Observations Read  4172
Number of Observations Used  3381
Training Results
Descr Value
Inner Product of Weights  61.485621478
Bias  -0.030902475
Total Slack (Constraint Violations)  803.54947822
Norm of Longest Vector  6.6144307957
Number of Support Vectors  915
Number of Support Vectors on Margin  820
Maximum F  9.2242861052
Minimum F  -3.384251501
Number of Effects  13
Columns in Data Matrix  20
Columns in Kernel Matrix  231
Iteration History
Iteration Complementarity Feasibility
1  1002173.5349  3141665.2931
2  1146.1794706  2029.634243
3  206.45156575  220.05770528
4  27.327522504  2.2005772E-6
5  7.4107217975  5.1757674E-7
6  4.3921099214  2.304127E-7
7  1.9781933884  8.3948865E-8
8  1.4370881171  4.7023301E-8
9  0.5208088253  1.0371652E-8
10  0.2003918815  1.9398065E-9
11  0.1173152113  8.713565E-10
12  0.0666236246  3.892865E-10
13  0.0425686627  1.864289E-10
14  0.0351930774  1.20305E-10
15  0.0229514463  6.909817E-11
16  0.0154325983  3.273337E-11
17  0.0099116656  1.692008E-11
18  0.0059553316  7.606138E-12
19  0.0035718034  3.314515E-12
20  0.0018483936  8.169576E-13
21  0.0008064443  2.411936E-13
22  0.0003175807  2.561203E-13
23  0.0000745325  1.394449E-13
24  6.4221743E-6  8.326673E-14
25  1.6987549E-7  8.384457E-13
Misclassification Matrix
Observed 0 1 Total
0  2589  123  2712
1  238  431  669
Total  2827  554  3381
Fit Statistics
Statistic Training
Accuracy  0.893226856
Error  0.106773144
Sensitivity  0.9546460177
Specificity  0.644245142
Task Timing
Task Seconds Percent
Loading the Store  0.0000190735  0.0000763847
Creating the State  0.0201158524  0.0805591361
Scoring  0.2295641899  0.9193492023
Total  0.2497029305  1
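
The fit statistics in the SVM output follow directly from the misclassification matrix. A quick sanity check in plain Python, using the counts reported above:

```python
# Sanity-check the SVM fit statistics against the misclassification matrix.
# Rows are observed classes; columns are predicted classes.
obs0_pred0, obs0_pred1 = 2589, 123   # observed BAD=0
obs1_pred0, obs1_pred1 = 238, 431    # observed BAD=1

total = obs0_pred0 + obs0_pred1 + obs1_pred0 + obs1_pred1   # 3381
accuracy = (obs0_pred0 + obs1_pred1) / total
error = 1 - accuracy

# In this output, Sensitivity is the recall of the observed-0 row and
# Specificity the recall of the observed-1 row.
sensitivity = obs0_pred0 / (obs0_pred0 + obs0_pred1)
specificity = obs1_pred1 / (obs1_pred0 + obs1_pred1)

print(round(accuracy, 9), round(sensitivity, 10), round(specificity, 9))
# → 0.893226856 0.9546460177 0.644245142
```

The printed values match the Fit Statistics table, confirming how the procedure aggregates the matrix.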

Assess Models


In [24]:
def assess_model(prefix):
    return sess.percentile.assess(
      table={
        "name":"_scored_" + prefix, 
        "where": "strip(put(_partind_, best.))='0'"
      },
      inputs=[{"name":"p_bad1"}],      
      response="bad",
      event="1",
      pVar=["p_bad0"],
      pEvent=["0"]
    )

rfAssess=assess_model(prefix="rf")    
rf_fitstat =rfAssess.FitStat
rf_rocinfo =rfAssess.ROCInfo
rf_liftinfo=rfAssess.LIFTInfo

gbAssess=assess_model(prefix="gb")    
gb_fitstat =gbAssess.FitStat
gb_rocinfo =gbAssess.ROCInfo
gb_liftinfo=gbAssess.LIFTInfo

nnAssess=assess_model(prefix="nn")    
nn_fitstat =nnAssess.FitStat
nn_rocinfo =nnAssess.ROCInfo
nn_liftinfo=nnAssess.LIFTInfo

svmAssess=assess_model(prefix="svm")    
svm_fitstat =svmAssess.FitStat
svm_rocinfo =svmAssess.ROCInfo
svm_liftinfo=svmAssess.LIFTInfo

Create ROC and Lift plots (using Validation data)

Prepare assessment results for plotting


In [25]:
# Add new variable to indicate type of model
rf_liftinfo["model"]="Forest"
rf_rocinfo["model"]="Forest"
gb_liftinfo["model"]="GradientBoosting"
gb_rocinfo["model"]="GradientBoosting"
nn_liftinfo["model"]="NeuralNetwork"
nn_rocinfo["model"]="NeuralNetwork"
svm_liftinfo["model"]="SVM"
svm_rocinfo["model"]="SVM"

# Concatenate the per-model results
# (DataFrame.append was removed in pandas 2.0; pd.concat is the equivalent)
all_liftinfo = pd.concat(
    [rf_liftinfo, gb_liftinfo, nn_liftinfo, svm_liftinfo], ignore_index=True)
all_rocinfo = pd.concat(
    [rf_rocinfo, gb_rocinfo, nn_rocinfo, svm_rocinfo], ignore_index=True)

In [26]:
print("AUC (using validation data)".center(80, '-'))
all_rocinfo[["model", "C"]].drop_duplicates(keep="first").sort_values(by="C", ascending=False)


--------------------------AUC (using validation data)---------------------------
Out[26]:
model C
0 Forest 0.909909
100 GradientBoosting 0.878100
200 NeuralNetwork 0.875123
300 SVM 0.872880
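
The C statistic reported by assess is the area under the ROC curve. As an illustration (not the SAS implementation), the same quantity can be recovered from the (FPR, Sensitivity) pairs by trapezoidal integration; a minimal sketch, assuming the ROCInfo column names used in the plotting cell:

```python
import numpy as np

def auc_from_roc(fpr, tpr):
    """Trapezoidal area under ROC points, anchored at (0,0) and (1,1)."""
    order = np.argsort(fpr)
    f = np.concatenate([[0.0], np.asarray(fpr, float)[order], [1.0]])
    t = np.concatenate([[0.0], np.asarray(tpr, float)[order], [1.0]])
    # Sum of trapezoid areas between consecutive ROC points
    return float(np.sum((f[1:] - f[:-1]) * (t[1:] + t[:-1]) / 2.0))

# Toy check: the chance diagonal has AUC 0.5
print(auc_from_roc([0.25, 0.5, 0.75], [0.25, 0.5, 0.75]))  # → 0.5
```

Applied to, e.g., `rf_rocinfo["FPR"]` and `rf_rocinfo["Sensitivity"]`, this should approximate the forest's C value of 0.909909 up to the cutoff granularity.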

Draw ROC and Lift plots


In [27]:
# Draw ROC charts 
plt.figure(figsize=(15,4))
for key, grp in all_rocinfo.groupby("model"):  # scalar key keeps the legend label a plain string
    plt.plot(grp["FPR"], grp["Sensitivity"], label=key)
plt.plot([0,1], [0,1], "k--")
plt.xlabel("False Positive Rate")
plt.ylabel("True Positive Rate")
plt.grid(True)
plt.legend(loc="best")
plt.title("ROC Curve (using validation data)")
plt.show()

# Draw lift charts
plt.figure(figsize=(15,4))
for key, grp in all_liftinfo.groupby("model"):
    plt.plot(grp["Depth"], grp["Lift"], label=key)
plt.xlabel("Depth")
plt.ylabel("Lift")
plt.grid(True)
plt.legend(loc="best")
plt.title("Lift Chart (using validation data)")
plt.show()
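
For intuition, cumulative lift at a given depth is the event rate among the top-scored fraction of cases divided by the overall event rate. A small pandas sketch on synthetic data (illustrative only, not the assess action's algorithm):

```python
import pandas as pd

def cumulative_lift(scores, labels, depth_pct):
    """Event rate in the top depth_pct% of scores, relative to the base rate."""
    df = pd.DataFrame({"score": scores, "label": labels}) \
           .sort_values("score", ascending=False)
    n_top = max(1, int(round(len(df) * depth_pct / 100.0)))
    top_rate = df["label"].head(n_top).mean()
    base_rate = df["label"].mean()
    return top_rate / base_rate

# 10 cases, 2 events concentrated at the highest scores:
scores = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05]
labels = [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]
print(cumulative_lift(scores, labels, 20))  # top 20% captures both events → 5.0
```

A lift of 5 at depth 20 means the model finds events five times as often in its top fifth as random selection would, which is the quantity plotted on the y-axis above.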


End CAS session


In [28]:
sess.close()
